Modified Richardson iteration
Modified Richardson iteration is an iterative method for solving a system of linear equations. Richardson iteration was proposed by Lewis Fry Richardson in his work dated 1910. It is similar to the Jacobi and Gauss–Seidel methods.
We seek the solution to a set of linear equations, expressed in matrix terms as
: A x = b.\,
The Richardson iteration is
:
x^{(k+1)} = x^{(k)} + \omega \left( b - A x^{(k)} \right),

where \omega is a scalar parameter that has to be chosen such that the sequence (x^{(k)}) converges.
It is easy to see that the method has the correct fixed points, because if it converges, then x^{(k+1)} \approx x^{(k)} and x^{(k)} has to approximate a solution of A x = b.
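The update above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the article; the stopping rule (residual norm below a tolerance) and the example matrix are assumptions added for demonstration:

```python
import numpy as np

def richardson(A, b, omega, x0=None, tol=1e-10, max_iter=10_000):
    """Richardson iteration: x_{k+1} = x_k + omega * (b - A x_k)."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for _ in range(max_iter):
        r = b - A @ x              # residual; zero residual means A x = b
        if np.linalg.norm(r) < tol:
            break
        x = x + omega * r          # the Richardson update
    return x

# Example (assumed): a small symmetric positive definite system.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = richardson(A, b, omega=0.4)    # 0 < omega < 2/lambda_max ensures convergence
```

For this matrix the eigenvalues lie between roughly 2.38 and 4.62, so any \omega below 2/4.62 \approx 0.43 makes the iteration converge; \omega = 0.4 was chosen inside that range.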
== Convergence ==

Subtracting the exact solution x, and introducing the notation e^{(k)} = x^{(k)} - x for the error, we get the equality for the errors
:
e^{(k+1)} = e^{(k)} - \omega A e^{(k)} = (I - \omega A) e^{(k)}.

Thus,
:
\|e^{(k+1)}\| = \|(I - \omega A) e^{(k)}\| \leq \|I - \omega A\| \, \|e^{(k)}\|,

for any vector norm and the corresponding induced matrix norm. Thus, if \|I - \omega A\| < 1, the method converges.
Suppose that A is diagonalizable and that (\lambda_j, v_j) are the eigenvalues and eigenvectors of A. The error converges to 0 if |1 - \omega \lambda_j| < 1 for all eigenvalues \lambda_j. If, e.g., all eigenvalues are positive, this can be guaranteed if \omega is chosen such that 0 < \omega < 2/\lambda_{\max}(A). The optimal choice, minimizing all |1 - \omega \lambda_j|, is \omega = 2/(\lambda_{\min}(A) + \lambda_{\max}(A)), which gives the simplest Chebyshev iteration.
If there are both positive and negative eigenvalues, the method will diverge for any \omega if the initial error e^{(0)} has nonzero components in the corresponding eigenvectors.
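The optimal-parameter formula can be checked numerically. The sketch below, using an assumed small SPD matrix, computes \omega = 2/(\lambda_{\min} + \lambda_{\max}) and verifies that the resulting contraction factor max_j |1 - \omega \lambda_j| is below 1 and equals (\lambda_{\max} - \lambda_{\min})/(\lambda_{\max} + \lambda_{\min}):

```python
import numpy as np

# Assumed example: a symmetric positive definite matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

lam = np.linalg.eigvalsh(A)          # eigenvalues in ascending order
lam_min, lam_max = lam[0], lam[-1]

omega_opt = 2.0 / (lam_min + lam_max)

# Error reduction per step is governed by max_j |1 - omega * lambda_j|;
# for the optimal omega both extreme factors have equal magnitude.
rho = max(abs(1 - omega_opt * lam_min), abs(1 - omega_opt * lam_max))
```

With all eigenvalues positive, rho < 1, so the error shrinks geometrically; the better conditioned A is (the closer \lambda_{\min} is to \lambda_{\max}), the smaller rho becomes.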

Source: the free encyclopedia Wikipedia (English edition), article "Modified Richardson iteration".